Trace-class Gaussian priors for Bayesian learning of neural networks with MCMC

Authors

Abstract

This paper introduces a new neural network based prior for real valued functions. Each weight and bias of the network has an independent Gaussian prior, with the key novelty that the variances decrease with the width of the network in such a way that the resulting function is well defined in the limit of an infinite-width network. We show that the induced posterior over functions is amenable to Monte Carlo sampling using Hilbert space Markov chain Monte Carlo (MCMC) methods. This type of MCMC is stable under mesh refinement, i.e. the acceptance probability does not degenerate as more parameters of the function's prior are introduced, even ad infinitum. We demonstrate these advantages over other priors, for example in Bayesian Reinforcement Learning.
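
To make the construction concrete, here is a minimal sketch in NumPy, under stated assumptions rather than the paper's exact choices: a single hidden layer with ReLU units, output-weight standard deviations decaying like j^(-1) so the prior variances are summable (trace-class), and a preconditioned Crank–Nicolson (pCN) proposal as the Hilbert-space MCMC method. The data, noise level, decay exponent, and step size are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Trace-class prior (illustrative): the j-th output weight has standard
# deviation j**(-ALPHA); with ALPHA > 0.5 the variances are summable, so the
# random function below has a well defined infinite-width limit.
WIDTH, ALPHA = 256, 1.0
prior_std = np.arange(1, WIDTH + 1, dtype=float) ** (-ALPHA)

def sample_prior():
    """Draw all weights and biases from the independent Gaussian prior."""
    w_in = rng.normal(size=WIDTH)                # unit-variance input weights
    b = rng.normal(size=WIDTH)                   # unit-variance biases
    w_out = prior_std * rng.normal(size=WIDTH)   # variance-decaying output weights
    return w_in, b, w_out

def forward(params, x):
    """f(x) = sum_j w_out[j] * relu(w_in[j] * x + b[j])."""
    w_in, b, w_out = params
    return np.maximum(w_in * x[:, None] + b, 0.0) @ w_out

# Synthetic regression data (illustrative).
x_obs = np.linspace(-2.0, 2.0, 40)
y_obs = np.sin(2.0 * x_obs) + 0.1 * rng.normal(size=40)

def neg_log_lik(params):
    resid = forward(params, x_obs) - y_obs
    return 0.5 * np.sum(resid**2) / 0.1**2

# pCN: the proposal preserves the Gaussian prior, so the acceptance ratio
# involves only the likelihood and does not degenerate as WIDTH grows.
beta = 0.2                                       # pCN step size (illustrative)
params = sample_prior()
phi = neg_log_lik(params)
for _ in range(5000):
    xi = sample_prior()
    prop = tuple(np.sqrt(1.0 - beta**2) * p + beta * z
                 for p, z in zip(params, xi))
    phi_prop = neg_log_lik(prop)
    if np.log(rng.uniform()) < phi - phi_prop:   # accept/reject on likelihood only
        params, phi = prop, phi_prop
```

The key design point is that the pCN proposal is prior-reversible, so refining the discretisation (increasing WIDTH) changes only the likelihood term in the acceptance ratio; this is the mesh-refinement stability claimed in the abstract.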


Related articles

Bayesian learning for a class of priors with prescribed marginals

We present Bayesian updating of an imprecise probability measure, represented by a class of precise multidimensional probability measures. Choice and analysis of our class are motivated by expert interviews that we conducted with modelers in the context of climatic change. From the interviews we deduce that generically, experts hold a much more informed opinion on the marginals of uncertain par...


Bayesian Active Learning of Neural Firing Rate Maps with Transformed Gaussian Process Priors

A firing rate map, also known as a tuning curve, describes the nonlinear relationship between a neuron's spike rate and a low-dimensional stimulus (e.g., orientation, head direction, contrast, color). Here we investigate Bayesian active learning methods for estimating firing rate maps in closed-loop neurophysiology experiments. These methods can accelerate the characterization of such maps thro...
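
A transformed GP prior can be sketched in a few lines: draw a latent function from a GP and pass it through a nonnegative link to obtain a firing rate, with spikes modelled as Poisson counts. The RBF kernel, exponential link, and bin width below are assumptions for illustration, not necessarily the transformation used in the article.

```python
import numpy as np

rng = np.random.default_rng(1)

# Latent GP on a 1-D stimulus grid (e.g., orientation), RBF kernel (assumed).
stim = np.linspace(0.0, 2.0 * np.pi, 100)
K = np.exp(-0.5 * (stim[:, None] - stim[None, :]) ** 2 / 0.5**2)
f = rng.multivariate_normal(np.zeros(100), K + 1e-8 * np.eye(100))

rate = np.exp(f)                    # nonnegative firing rate map via exp link
spikes = rng.poisson(rate * 0.05)   # Poisson spike counts in 50 ms bins
```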


Bayesian inference with rescaled Gaussian process priors

We use rescaled Gaussian processes as prior models for functional parameters in nonparametric statistical models. We show how the rate of contraction of the posterior distributions depends on the scaling factor. In particular, we exhibit rescaled Gaussian process priors yielding posteriors that contract around the true parameter at optimal convergence rates. To derive our results we e...
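
The rescaling can be illustrated directly: stretch or compress the index of a base Gaussian process, f_c(x) = W(x / c), so that the scaling factor c controls the effective smoothness of prior draws. The Matern 3/2 kernel below is an illustrative choice, not necessarily the one analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def rescaled_gp_draw(x, c, ell=1.0):
    """Draw f_c(x) = W(x / c) for a base GP W with a Matern 3/2 kernel."""
    xs = x / c                                   # the rescaling step
    d = np.abs(xs[:, None] - xs[None, :])
    K = (1.0 + np.sqrt(3.0) * d / ell) * np.exp(-np.sqrt(3.0) * d / ell)
    return rng.multivariate_normal(np.zeros(len(x)), K + 1e-8 * np.eye(len(x)))

x = np.linspace(0.0, 1.0, 200)
rough = rescaled_gp_draw(x, c=0.1)    # small c: wiggly draws
smooth = rescaled_gp_draw(x, c=10.0)  # large c: smooth draws
```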


Invariance priors for Bayesian feed-forward neural networks

Neural networks (NNs) are valued for their flexibility in problems where there is insufficient knowledge to set up a proper model. On the other hand, this flexibility can cause overfitting and can hamper the generalization of neural networks. Many approaches to regularizing NNs have been suggested, but most of them are based on ad hoc arguments. Employing the principle of transformati...


On MCMC Sampling in Bayesian MLP Neural Networks

Bayesian MLP neural networks are a flexible tool for complex nonlinear problems. The approach is complicated by the need to evaluate integrals over high-dimensional probability distributions. These integrals are generally approximated with Markov chain Monte Carlo (MCMC) methods. Several practical issues arise when implementing MCMC. This article discusses the choice of starting values...
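
To make the practical issues concrete, here is a minimal random-walk Metropolis sampler over the weights of a tiny MLP. The architecture, standard-normal prior, data, starting value, and step size are all illustrative assumptions, not the article's setup.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic regression data (illustrative).
x = np.linspace(-1.0, 1.0, 30)
y = x**2 + 0.05 * rng.normal(size=30)

H = 8  # hidden units; theta packs [w_in (H), b (H), w_out (H)]

def log_post(theta):
    """Unnormalised log posterior: Gaussian likelihood + N(0, 1) prior."""
    w_in, b, w_out = np.split(theta, 3)
    pred = np.tanh(np.outer(x, w_in) + b) @ w_out
    log_lik = -0.5 * np.sum((pred - y) ** 2) / 0.05**2
    log_prior = -0.5 * np.sum(theta**2)
    return log_lik + log_prior

theta = rng.normal(size=3 * H)    # overdispersed starting value from the prior
lp, step = log_post(theta), 0.02  # step size would need tuning in practice
for _ in range(20000):
    prop = theta + step * rng.normal(size=theta.size)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
```

Starting several chains from different prior draws and comparing them is the usual way to diagnose whether the choice of starting values matters, one of the practical issues the article raises.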



Journal

Journal title: Journal of the Royal Statistical Society Series B: Statistical Methodology

Year: 2023

ISSN: 1369-7412, 1467-9868

DOI: https://doi.org/10.1093/jrsssb/qkac005